# Pre-layer normalization
## Efficient Mlm M0.40-801010
This model studies the effectiveness of the masking rate in masked language modeling, employing pre-layer normalization techniques not currently supported by HuggingFace.
Tags: Large Language Model, Transformers
Maintainer: princeton-nlp
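The "801010" suffix in the model name plausibly refers to BERT-style 80/10/10 corruption of the selected tokens: 80% are replaced with `[MASK]`, 10% with a random token, and 10% are left unchanged. A minimal sketch in plain Python under that assumption; `mask_for_mlm` and its parameters are illustrative names, not from the princeton-nlp code, and a toy whole-token vocabulary stands in for a real tokenizer.

```python
import random

def mask_for_mlm(tokens, mask_rate=0.15, seed=0):
    """Select `mask_rate` of positions and corrupt them with the
    assumed 80/10/10 scheme; return (corrupted tokens, labels),
    where labels hold the original token at predicted positions."""
    rng = random.Random(seed)
    vocab = sorted(set(tokens))            # toy stand-in for a tokenizer vocab
    out = list(tokens)
    labels = [None] * len(tokens)          # None = position not predicted
    n_select = max(1, round(mask_rate * len(tokens)))
    for i in rng.sample(range(len(tokens)), n_select):
        labels[i] = tokens[i]              # model must recover the original
        roll = rng.random()
        if roll < 0.8:
            out[i] = "[MASK]"              # 80%: replace with the mask token
        elif roll < 0.9:
            out[i] = rng.choice(vocab)     # 10%: replace with a random token
        # else 10%: keep the original token unchanged
    return out, labels

# Example: corrupted, labels = mask_for_mlm("the cat sat on the mat".split())
```

Raising `mask_rate` to 0.40 reproduces the higher masking rate implied by the m0.40 model name.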
## Efficient Mlm M0.15
This model investigates the effectiveness of masking 15% of the content in masked language modeling, employing a pre-layer normalization approach.
Tags: Large Language Model, Transformers
Maintainer: princeton-nlp
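Both entries mention pre-layer normalization, which moves LayerNorm from after the residual addition (post-LN, as in the original Transformer) to the input of each sublayer, leaving the residual path unnormalized. A minimal NumPy sketch of the two block structures; the function names are illustrative and not taken from the princeton-nlp implementation, and the sublayer is an arbitrary callable standing in for attention or a feed-forward network.

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    """Normalize the last axis to zero mean and unit variance (no learned scale/shift)."""
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def post_ln_block(x, sublayer):
    # Post-LN (original Transformer): normalize AFTER the residual add.
    return layer_norm(x + sublayer(x))

def pre_ln_block(x, sublayer):
    # Pre-LN: normalize the INPUT to the sublayer; the residual path
    # carries x through unnormalized, which tends to stabilize training.
    return x + sublayer(layer_norm(x))
```

The key difference visible in the sketch: a post-LN block always emits normalized activations, while a pre-LN block preserves the mean of its input along the residual path.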